Slepian–Wolf Coding

In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It gives a theoretical bound on the lossless compression rates of two correlated sources that are encoded separately.


Problem setup

Distributed coding is the coding of two (in this case) or more dependent sources with separate encoders and a joint decoder. Given two statistically dependent, independent and identically distributed (i.i.d.) finite-alphabet random sequences X^n and Y^n, the Slepian–Wolf theorem gives a theoretical bound for the lossless coding rate for distributed coding of the two sources. A simple model of such a pair of sources is sketched below.
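For concreteness, here is a minimal sketch of one such pair of sources. The binary-symmetric correlation model (each Y_i equals X_i except with a fixed crossover probability) is an assumption made for illustration only; the theorem itself applies to any dependent i.i.d. finite-alphabet pair.

```python
import random

def correlated_pair(n, crossover=0.1, seed=0):
    """Draw dependent i.i.d. binary sequences (X^n, Y^n): each X_i is a
    fair coin flip, and Y_i equals X_i except with probability `crossover`."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    y = [xi ^ (rng.random() < crossover) for xi in x]
    return x, y

x, y = correlated_pair(20)
```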


Theorem

The bounds for the lossless coding rates are as follows:
: R_X \geq H(X\mid Y),
: R_Y \geq H(Y\mid X),
: R_X + R_Y \geq H(X,Y).

If the two sources are encoded and decoded independently of each other, the lowest rates achievable for lossless compression are H(X) and H(Y) for X and Y respectively, where H(X) and H(Y) are the entropies of X and Y. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved: as long as the total rate of X and Y is larger than their joint entropy H(X,Y) and neither source is encoded at a rate smaller than its conditional entropy given the other, distributed coding can achieve an arbitrarily small error probability for long sequences. These conditions can be checked numerically, as in the sketch below.
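The following sketch computes the relevant entropies from a joint probability mass function and tests whether a rate pair lies in the Slepian–Wolf region; the joint pmf in the example is made up for illustration and corresponds to the binary model above with crossover probability 0.1.

```python
from math import log2

def entropies(p_xy):
    """Given a joint pmf p_xy[x][y], return (H(X), H(Y), H(X,Y)) in bits."""
    h = lambda ps: -sum(p * log2(p) for p in ps if p > 0)
    h_x = h([sum(row) for row in p_xy])
    h_y = h([sum(col) for col in zip(*p_xy)])
    h_xy = h([p for row in p_xy for p in row])
    return h_x, h_y, h_xy

def in_slepian_wolf_region(rx, ry, p_xy):
    """Check R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X,Y),
    using H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X)."""
    h_x, h_y, h_xy = entropies(p_xy)
    return rx >= h_xy - h_y and ry >= h_xy - h_x and rx + ry >= h_xy

p = [[0.45, 0.05],
     [0.05, 0.45]]
print(in_slepian_wolf_region(0.5, 1.0, p))  # True: near the corner R_X = H(X|Y), R_Y = H(Y)
print(in_slepian_wolf_region(0.4, 1.0, p))  # False: R_X < H(X|Y), about 0.469 bits
```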
A special case of distributed coding is compression with decoder side information, where source Y is available at the decoder but not accessible at the encoder. This can be treated as the case in which the rate R_Y = H(Y) has already been used to encode Y, while X is encoded at the conditional rate R_X = H(X\mid Y). In other words, two isolated sources can compress data as efficiently as if they were communicating with each other. The whole system then operates in an asymmetric way: the compression rates for the two sources differ. This corner point can be approached in practice with syndrome-based binning, sketched below.

This bound has been extended to the case of more than two correlated sources by Thomas M. Cover in 1975, and similar results were obtained in 1976 by Aaron D. Wyner and Jacob Ziv with regard to lossy coding of jointly Gaussian sources.
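Syndrome-based binning is the idea behind schemes such as DISCUS. The sketch below is an illustration under a strong assumption, namely that X and Y differ in at most one position per 7-bit block; it is not Slepian and Wolf's original random-binning argument. The encoder of X transmits only the 3-bit syndrome of each block under a (7,4) Hamming code, and the decoder uses the side information Y to pick the right sequence from the indexed bin.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column i is the binary
# expansion of i + 1, so the syndrome of a single-bit error directly
# encodes the error's position.
H = np.array([[(i >> k) & 1 for i in range(1, 8)] for k in range(3)])

def encode(x):
    """Compress a 7-bit block x to its 3-bit syndrome H x (mod 2)."""
    return H @ x % 2

def decode(s, y):
    """Recover x from its syndrome s and the side information y,
    assuming x and y differ in at most one of the 7 positions."""
    diff = (s + H @ y) % 2                           # syndrome of the pattern x XOR y
    pos = int(diff[0] + 2 * diff[1] + 4 * diff[2])   # 0 means x == y
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1                          # flip the single differing bit
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1                              # side information differs in one bit
assert np.array_equal(decode(encode(x), y), x)       # 3 bits sent instead of 7
```

Each syndrome value indexes a bin of 16 sequences (a coset of the Hamming code), and the side information selects the unique member within one bit flip of y; this binning structure is what underlies the achievability argument.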


See also

* Data compression
* Data synchronization
** Synchronization (computer science)
* DISCUS
* Timeline of information theory




External links


* Wyner–Ziv Coding of Video: an algorithm for video compression that performs close to the Slepian–Wolf bound (with links to source code).